and (x₃, y₃) as shown in Table 4.1. If a regression model has been fitted for them, there will be three model outputs as well. Note that x and y are experimental data, but ŷ is not. For each pair, ŷ takes three values for the three values of x, i.e., α + βx₁, α + βx₂, or α + βx₃. The three model outputs are individually denoted by ŷ₁, ŷ₂ and ŷ₃, and ŷ is a vector of the regressed means. The final regression function is an interpolated curve based on these regressed means. Between the observations and the model outputs, there are three regression errors. They are denoted by ε₁, ε₂ and ε₃. Their squared sum is called the sum of squared errors, or the total regression error, and is denoted by ε = ε₁² + ε₂² + ε₃² in this case.
The correspondence between the independent variable x, the dependent variable y, the model outputs (the regressed means) and the regression errors. The squared errors are also listed.
Observation    Model output    Error              Squared error
y₁             ŷ₁              ε₁ = y₁ − ŷ₁       ε₁²
y₂             ŷ₂              ε₂ = y₂ − ŷ₂       ε₂²
y₃             ŷ₃              ε₃ = y₃ − ŷ₃       ε₃²
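The quantities in the table above can be computed directly: each model output is α + βxᵢ, each error is the observation minus its regressed mean, and the total regression error is the sum of the squared errors. A minimal sketch in Python, where the values of α, β, and the three data points are hypothetical illustrations, not values from the text:

```python
# Regression errors and sum of squared errors for a fitted line y_hat = alpha + beta * x.
# alpha, beta, x, and y below are hypothetical illustrative values.
alpha, beta = 1.0, 2.0
x = [1.0, 2.0, 3.0]   # independent variable (experimental data)
y = [3.2, 4.9, 7.1]   # dependent variable (experimental data)

# Model outputs (the regressed means), one per observation.
y_hat = [alpha + beta * xi for xi in x]

# Regression errors: epsilon_i = y_i - y_hat_i.
errors = [yi - yhi for yi, yhi in zip(y, y_hat)]

# Total regression error: epsilon = epsilon_1^2 + epsilon_2^2 + epsilon_3^2.
sse = sum(e * e for e in errors)
```

With these illustrative values the regressed means are [3.0, 5.0, 7.0], so the errors are roughly [0.2, −0.1, 0.1] and the sum of squared errors is about 0.06.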
The illustration of the regression errors. The open dots apart from the solid line are the observations (y). The filled dots on the solid line stand for the regressed means (predictions or model outputs, ŷ). The solid line stands for the regression function, i.e., the interpolated function based on the regressed means. The vertical dashed lines stand for the regression errors (distances) between the observations and the regressed means.
Figure 4.5 shows an example where there are seven data points, seven model outputs (predictions or regressed means) and seven errors between